Quadratic Optimization with Orthogonality Constraints: Explicit Łojasiewicz Exponent and Linear Convergence of Line-Search Methods

Authors

  • Huikang Liu
  • Weijie Wu
  • Anthony Man-Cho So
Abstract

A fundamental class of matrix optimization problems that arise in many areas of science and engineering is that of quadratic optimization with orthogonality constraints. Such problems can be solved using line-search methods on the Stiefel manifold, which are known to converge globally under mild conditions. To determine the convergence rate of these methods, we give an explicit estimate of the exponent in a Łojasiewicz inequality for the (non-convex) set of critical points of the aforementioned class of problems. By combining such an estimate with known arguments, we are able to establish the linear convergence of a large class of line-search methods. A key step in our proof is to establish a local error bound for the set of critical points, which may be of independent interest.
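As general background, a Łojasiewicz (gradient) inequality at a critical point X̄ has the form |f(X) - f(X̄)|^θ ≤ c ||grad f(X)|| for all X near X̄, and linear convergence of gradient-type methods is typically tied to the exponent θ = 1/2. To make the setting concrete, below is a minimal sketch (an illustration under stated assumptions, not the authors' exact algorithm) of a line-search method on the Stiefel manifold for the model problem min tr(X^T A X) subject to X^T X = I: Riemannian gradient descent with Armijo backtracking and a QR retraction. The function name, starting point, and parameter values are all assumptions chosen for illustration.

import numpy as np

def stiefel_line_search(A, p, max_iter=500, tol=1e-8, seed=0):
    """Minimize tr(X^T A X) over the Stiefel manifold {X : X^T X = I}
    via Riemannian gradient descent with Armijo backtracking and a QR
    retraction. A minimal illustrative sketch, not the paper's exact method."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    X, _ = np.linalg.qr(rng.standard_normal((n, p)))  # random feasible start
    f = lambda Y: np.trace(Y.T @ A @ Y)
    for _ in range(max_iter):
        G = (A + A.T) @ X                             # Euclidean gradient
        S = X.T @ G
        grad = G - X @ (S + S.T) / 2                  # project onto tangent space
        gnorm2 = float(np.sum(grad * grad))
        if gnorm2 ** 0.5 < tol:
            break
        t = 1.0
        while t > 1e-12:                              # Armijo backtracking
            Xt, _ = np.linalg.qr(X - t * grad)        # QR retraction keeps X^T X = I
            if f(Xt) <= f(X) - 1e-4 * t * gnorm2:
                break
            t *= 0.5
        X = Xt
    return X

For a symmetric matrix A, any global minimizer of this model problem spans the eigenspace of the p smallest eigenvalues, which gives an easy correctness check for the sketch.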

Related articles

An affine scaling method for optimization problems with polyhedral constraints

Recently, an affine-scaling interior-point algorithm (ASL) was developed for box-constrained optimization problems with a single linear constraint (Gonzalez-Lima et al., SIAM J. Optim. 21:361–390, 2011). This note extends the algorithm to handle more general polyhedral constraints. With a line search, the resulting algorithm ASP maintains the global and R-linear convergence properties of ASL. In a...
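For background, the classical affine-scaling idea (stated here in its simplest nonnegativity-constrained form as an assumption about the general technique; the exact ASL/ASP scaling is not reproduced in this excerpt) replaces the negative gradient by a direction rescaled by the distance to the bounds:

\[
\min_{x \ge 0} f(x), \qquad d_k = -X_k^2 \nabla f(x_k), \quad X_k = \operatorname{Diag}(x_k),
\]

so that components close to the boundary move only slightly and the iterates remain interior.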

A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems

In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
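For reference, the standard secant relation mentioned above asks the updated Hessian approximation B_{k+1} to reproduce the observed gradient change along the last step (the paper's specific double-parameter scaling is not shown in this excerpt):

\[
B_{k+1} s_k = y_k, \qquad s_k = x_{k+1} - x_k, \quad y_k = \nabla f(x_{k+1}) - \nabla f(x_k).
\]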

Subgradient-based Neural Network for Nonconvex Optimization Problems in Support Vector Machines with Indefinite Kernels

Support vector machines (SVMs) with positive semidefinite kernels yield convex quadratic programming problems. SVMs with indefinite kernels yield nonconvex quadratic programming problems. Most optimization methods for SVMs rely on the convexity of the objective function and are not efficient for solving such nonconvex problems. In this paper, we propose a subgradient-based neural network (SGNN) for...
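To see where the convexity comes in, recall the standard SVM dual (a textbook formulation, not specific to this paper):

\[
\max_{\alpha} \; \sum_{i} \alpha_i - \frac{1}{2} \sum_{i,j} \alpha_i \alpha_j y_i y_j K(x_i, x_j)
\quad \text{s.t.} \quad 0 \le \alpha_i \le C, \;\; \sum_{i} \alpha_i y_i = 0.
\]

The maximization is a convex program exactly when the Gram matrix [K(x_i, x_j)] is positive semidefinite; an indefinite kernel makes the quadratic form indefinite, hence the nonconvex problems targeted here.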

On the Sequential Quadratically Constrained Quadratic Programming Methods

An iteration of the sequential quadratically constrained quadratic programming method (SQCQP) consists of minimizing a quadratic approximation of the objective function subject to quadratic approximations of the constraints, followed by a line search in the obtained direction. Methods of this class are receiving attention due to the development of efficient interior point techniques for solving s...
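In one common form (a generic statement of the subproblem, not necessarily the exact variant studied in this paper), the SQCQP direction d_k solves

\[
\min_{d} \; \nabla f(x_k)^{\top} d + \frac{1}{2} d^{\top} H_k d
\quad \text{s.t.} \quad g_i(x_k) + \nabla g_i(x_k)^{\top} d + \frac{1}{2} d^{\top} H_{i,k} d \le 0, \quad i = 1, \dots, m,
\]

where H_k and H_{i,k} are (approximate) Hessians of the objective and constraints; the line search is then performed along d_k.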

Publication date: 2016